Energy Efficient Sparse Connectivity from Imbalanced Synaptic Plasticity Rules
Authors
Abstract
It is believed that energy efficiency is an important constraint in brain evolution. As synaptic transmission dominates energy consumption, energy can be saved by ensuring that only a few synapses are active. It is therefore likely that the formation of sparse codes and sparse connectivity are fundamental objectives of synaptic plasticity. In this work we study how sparse connectivity can result from a synaptic learning rule of excitatory synapses. Information is maximised when potentiation and depression are balanced according to the mean presynaptic activity level and the resulting fraction of zero-weight synapses is around 50%. However, an imbalance towards depression increases the fraction of zero-weight synapses without significantly affecting performance. We show that imbalanced plasticity corresponds to imposing a regularising constraint on the L1-norm of the synaptic weight vector, a procedure that is well-known to induce sparseness. Imbalanced plasticity is biophysically plausible and leads to more efficient synaptic configurations than a previously suggested approach that prunes synapses after learning. Our framework gives a novel interpretation to the high fraction of silent synapses found in brain regions like the cerebellum.
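As a rough illustration of the link the abstract draws between depression-biased plasticity and L1 regularisation, the sketch below trains a non-negative linear readout by gradient descent and adds a constant depression step that acts as the proximal update of an L1 penalty. This is not the paper's actual model; all names and parameter values (n_syn, lam, eta, n_steps) are illustrative assumptions. Increasing the depression weight lam drives a growing fraction of synapses exactly to zero while the training error changes only modestly.

```python
# Illustrative sketch, not the paper's learning rule: a non-negative linear
# readout trained by gradient descent, where a constant bias towards synaptic
# depression enters as an L1 penalty.  The proximal (soft-threshold) step
# drives part of the weight vector exactly to zero, i.e. silent synapses.
# All names and values (n_syn, lam, eta, n_steps, ...) are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_syn, n_patterns = 500, 200
X = rng.binomial(1, 0.2, size=(n_patterns, n_syn)).astype(float)  # sparse binary presynaptic activity
w_true = np.abs(rng.normal(size=n_syn))                           # ground-truth excitatory weights
y = X @ w_true + 0.1 * rng.normal(size=n_patterns)                # noisy target output

def train(lam, eta=0.01, n_steps=2000):
    """Minimise 0.5*||X w - y||^2 / n + lam*||w||_1 subject to w >= 0."""
    w = np.abs(rng.normal(size=n_syn))
    for _ in range(n_steps):
        err = X @ w - y
        w -= eta * (X.T @ err) / n_patterns  # error-driven potentiation/depression
        w = np.maximum(w - eta * lam, 0.0)   # extra depression = L1 proximal step; clips weights at zero
    return w

for lam in [0.0, 0.5, 2.0]:
    w = train(lam)
    print(f"lam={lam:4.1f}  zero-weight fraction={np.mean(w == 0.0):.2f}  "
          f"training MSE={np.mean((X @ w - y) ** 2):.3f}")
```

Here the combined update max(w - eta*lam, 0) is the proximal operator of lam*||w||_1 restricted to non-negative weights, which is one way to read the abstract's statement that an imbalance towards depression acts as a regularising constraint on the L1-norm of the synaptic weight vector.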
Similar Resources
Spike Timing-Dependent Plasticity in the Address Domain
Address-event representation (AER), originally proposed as a means to communicate sparse neural events between neuromorphic chips, has proven efficient in implementing large-scale networks with arbitrary, configurable synaptic connectivity. In this work, we further extend the functionality of AER to implement arbitrary, configurable synaptic plasticity in the address domain. As proof of concept...
Mirrored STDP Implements Autoencoder Learning in a Network of Spiking Neurons
The autoencoder algorithm is a simple but powerful unsupervised method for training neural networks. Autoencoder networks can learn sparse distributed codes similar to those seen in cortical sensory areas such as visual area V1, but they can also be stacked to learn increasingly abstract representations. Several computational neuroscience models of sensory areas, including Olshausen & Field's S...
Memory formation and recall in recurrent spiking neural networks
Our brain has the capacity to analyze a visual scene in a split second, to learn how to play an instrument, and to remember events, faces and concepts. Neurons underlie all of these diverse functions. Neurons, cells within the brain that generate and transmit electrical activity, communicate with each other through chemical synapses. These synaptic connections dynamically change with experience...
Comparison of generalized Hebbian rules for long-term synaptic plasticity
A large variety of synaptic plasticity rules have been used in models of excitatory synaptic plasticity (Brown et al., 1990). These rules are generalizations of the Hebbian rule and have some properties consistent with experimental data on long-term excitatory synaptic plasticity, but they also have some properties inconsistent with experimental data. For example, the BCM rule (Bear et al., 198...
Plasticity of the Developing Glutamate Synapse in the Hippocampus
Synapses are highly plastic, i.e. they have the ability to change their signaling strength both in the short- and long-term (e.g. long-term potentiation, LTP) in response to specific patterns of activity. In the developing brain synaptic plasticity promotes activity-dependent development, whereas in the mature brain synaptic plasticity forms the basis for learning and memory. Although both develop...
Journal:
Volume 11, Issue
Pages -
Publication date: 2015